Boltzmann-Gibbs-Shannon entropy

Makarov: энтропия Больцмана-Гиббса-Шеннона (Boltzmann-Gibbs-Shannon entropy)

Универсальный англо-русский словарь (Universal English-Russian Dictionary), 2011.


See what "Boltzmann-Gibbs-Shannon entropy" means in other dictionaries:

  • History of entropy — The concept of entropy developed in response to the observation that a certain amount of functional energy released from combustion reactions is always lost to dissipation or friction and is thus not transformed into useful work. Early heat… …   Wikipedia

  • Entropy — This article is about entropy in thermodynamics. For entropy in information theory, see Entropy (information theory). For a comparison of entropy in information theory with entropy in thermodynamics, see Entropy in thermodynamics and information… …   Wikipedia

  • Entropy in thermodynamics and information theory — There are close parallels between the mathematical expressions for the thermodynamic entropy, usually denoted by S, of a physical system in the statistical thermodynamics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s; and the …   Wikipedia

  • Entropy (information theory) — In information theory, entropy is a measure of the uncertainty associated with a random variable. The term by itself in this context usually refers to the Shannon entropy, which quantifies, in the sense of an expected value, the information… …   Wikipedia

  • Entropy of mixing — The entropy of mixing is the change in the configuration entropy, an extensive thermodynamic quantity, when two different chemical substances or components are mixed. This entropy change must be positive since there is more uncertainty about the… …   Wikipedia

  • Gibbs entropy — In thermodynamics, specifically in statistical mechanics, the Gibbs entropy formula is the standard formula for calculating the statistical mechanical entropy of a thermodynamic system: S = −k_B Σ_i p_i ln p_i, where k_B is the… …   Wikipedia (a numerical sketch of this formula appears after this list)

  • Entropy (general concept) — In many branches of science, entropy refers to a certain measure of the disorder of a system. Entropy is particularly notable as it has a broad, common definition that is shared across physics, mathematics and information science. Although the… …   Wikipedia

  • Maximum entropy thermodynamics — In physics, maximum entropy thermodynamics (colloquially, MaxEnt thermodynamics) views equilibrium thermodynamics and statistical mechanics as inference processes. More specifically, MaxEnt applies inference techniques rooted in Shannon… …   Wikipedia

  • Von Neumann entropy — In quantum statistical mechanics, von Neumann entropy refers to the extension of classical entropy concepts to the field of quantum mechanics. John von Neumann rigorously established the correct mathematical framework for quantum mechanics with… …   Wikipedia

  • Principle of maximum entropy — This article is about the probability-theoretic principle. For the classifier in machine learning, see maximum entropy classifier. For other uses, see maximum entropy (disambiguation). … …   Wikipedia

  • Ludwig Boltzmann — Ludwig Eduard Boltzmann (February 20, 1844 – September 5, 1906), born in Vienna, Austrian Empire… …   Wikipedia
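
The Gibbs and Shannon formulas cited in the entries above differ only in the constant in front of the sum and in the base of the logarithm. As a quick illustration, here is a minimal Python sketch, not drawn from any of the dictionaries cited here; the function names and the example distribution are illustrative choices.

    import math

    K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

    def gibbs_entropy(probabilities):
        # Statistical-mechanical form: S = -k_B * sum_i p_i ln p_i (joules per kelvin)
        return -K_B * sum(p * math.log(p) for p in probabilities if p > 0.0)

    def shannon_entropy(probabilities):
        # Information-theoretic form: H = -sum_i p_i log2 p_i (bits)
        return -sum(p * math.log2(p) for p in probabilities if p > 0.0)

    # Example: a uniform distribution over four microstates / symbols.
    p = [0.25, 0.25, 0.25, 0.25]
    print(gibbs_entropy(p))    # k_B * ln 4, roughly 1.91e-23 J/K
    print(shannon_entropy(p))  # log2(4) = 2.0 bits

Zero-probability terms are skipped, reflecting the usual convention 0 ln 0 = 0 in both definitions.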

